Universal ε-approximators for integrals

Authors

  • Michael Langberg
  • Leonard J. Schulman
Abstract

Let X be a space and F a family of {0, 1}-valued functions on X. Vapnik and Chervonenkis showed that if F is “simple” (finite VC dimension), then for every probability measure μ on X and every ε > 0 there is a finite set S such that for all f ∈ F, ∑_{x∈S} f(x)/|S| = [∫ f(x) dμ(x)] ± ε. Think of S as a “universal ε-approximator” for integration in F. S can actually be obtained w.h.p. just by sampling a few points from μ. This is a mainstay of computational learning theory. It was later extended by other authors to families of bounded (e.g., [0, 1]-valued) real functions. In this work we establish similar “universal ε-approximators” for families of unbounded nonnegative real functions — in particular, for the families over which one optimizes when performing data classification. (In this case the ε-approximation should be multiplicative.) Specifically, let F be the family of “k-median functions” (or k-means, etc.) on R^d with an arbitrary norm ρ. That is, any set of centers u_1, ..., u_k ∈ R^d determines an f by f(x) = (min_i ρ(x − u_i))^α. (Here α ≥ 0.) Then for every measure μ on R^d there exists a set S of cardinality poly(k, d, 1/ε) and a measure ν supported on S such that for every f ∈ F, ∑_{x∈S} f(x)ν(x) ∈ (1 ± ε) · (∫ f(x) dμ(x)).
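
To make the statement concrete, the following Python sketch (an illustration, not the paper's construction) treats a large finite point set as the measure μ, takes a plain uniform sample S weighted by 1/|S|, and spot-checks the multiplicative error between the weighted sum over S and the true average for a handful of k-means-style cost functions f(x) = (min_i ‖x − u_i‖)^α. All names and parameters in the sketch (kcost, d, k, alpha, the sample size m) are illustrative choices; the paper's actual guarantee is stronger: a weighted set of size poly(k, d, 1/ε) that works simultaneously for every f in the family, whereas the sketch only checks finitely many randomly drawn functions.

# Minimal sketch, NOT the paper's construction: uniform sampling with uniform
# weights, checked empirically against a few k-median / k-means cost functions.
import numpy as np

rng = np.random.default_rng(0)

def kcost(points, centers, alpha):
    """Evaluate f(x) = (min_i ||x - u_i||_2)^alpha at every row of `points`."""
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)  # (n, k)
    return dists.min(axis=1) ** alpha

d, k, alpha = 3, 4, 2.0                      # alpha = 2 corresponds to k-means
X = rng.normal(size=(200_000, d))            # finite point set standing in for mu

m = 2_000                                    # candidate approximator: uniform sample
S = X[rng.choice(len(X), size=m, replace=False)]

worst = 0.0
for _ in range(50):                          # spot-check a few functions f in the family
    centers = rng.normal(scale=2.0, size=(k, d))
    true_val = kcost(X, centers, alpha).mean()        # ~ integral of f d(mu)
    approx_val = kcost(S, centers, alpha).mean()      # sum over S with weights 1/|S|
    worst = max(worst, abs(approx_val / true_val - 1.0))

print(f"worst multiplicative error over the tested f's: {worst:.3f}")

For a fixed f, a uniform sample already concentrates around the true average with high probability; the difficulty the paper addresses is making one small weighted set work uniformly over the whole (unbounded) family, which plain sampling does not by itself guarantee.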

Related articles

Choquet integral based models for general approximation

In this paper we study decision-making models based on aggregation operators and, more specifically, on Choquet integrals. The motivation of our work is to study the modeling capabilities of these operators and to build models that can approximate arbitrary functions. We describe and study two models that are universal approximators.

On DNF Approximators for Monotone Boolean Functions

We study the complexity of approximating monotone Boolean functions with disjunctive normal form (DNF) formulas, exploring two main directions. First, we construct DNF approximators for arbitrary monotone functions achieving one-sided error: we show that every monotone f can be ε-approximated by a DNF g of size 2^(n − Ω_ε(√n)) satisfying g(x) ≤ f(x) for all x ∈ {0, 1}^n. This is the first non-trivial...

Discriminative Restricted Boltzmann Machines are Universal Approximators for Discrete Data

This report proves that discriminative Restricted Boltzmann Machines (RBMs) are universal approximators for discrete data by adapting existing universal approximation proofs for generative RBMs. (Laurens van der Maaten, Pattern Recognition & Bioinformatics Laboratory, Delft University of Technology.)

Neural Networks, Qualitative-Fuzzy Logic and Granular Adaptive Systems

Although traditional neural networks and fuzzy logic are powerful universal approximators, without some refinements they may not, in general, be good approximators for adaptive systems. By extending fuzzy sets to qualitative fuzzy sets, fuzzy logic may become a universal approximator for adaptive systems. Similar considerations can be extended to neural networks.

Neural Networks That Are Not Sensitive To The Imprecision of Hardware Neurons

It has been proved that 3-layer neural networks can approximate any continuous function with any given precision (Hecht-Nielsen, Cybenko, Funahashi, Hornik, Stinchcombe, White). These theorems mean, in particular, that for a plant whose state can be described by finitely many parameters x_1, ..., x_n, for an arbitrary control u = f(x_1, ..., x_n), and for an arbitrary precision ε > 0, we can ...

Publication date: 2009